Collaborative Multi-Agent Video Fast-Forwarding
Authors
Abstract
Multi-agent applications have recently gained significant popularity. In many computer vision tasks, a network of agents, such as a team of robots with cameras, could work collaboratively to perceive the environment for efficient and accurate situation awareness. However, these agents often have limited computation, communication, and storage resources. Thus, reducing resource consumption while still providing accurate perception becomes an important goal when deploying multi-agent systems. To achieve this goal, we identify and leverage the overlap among different camera views in multi-agent systems to reduce the processing, transmission, and storage of redundant/unimportant video frames. Specifically, we developed two collaborative video fast-forwarding frameworks for distributed and centralized settings, respectively. In these frameworks, each individual agent can selectively process or skip video frames at adjustable paces, based on multiple strategies learned via reinforcement learning. Multiple agents then collaboratively sense the environment via either 1) a consensus-based distributed framework called DMVF that periodically updates the fast-forwarding strategies of agents by establishing communication and reaching consensus among connected neighbors, or 2) a centralized framework called MFFNet that utilizes a central controller to decide the strategies of agents based on collected data. We demonstrate the efficacy and efficiency of our proposed frameworks on a real-world surveillance video dataset, VideoWeb, and a new simulated driving dataset, CarlaSim, through extensive simulations and deployment on an embedded platform with TCP communication. We show that, compared with other approaches in the literature, our frameworks achieve better coverage of important frames while significantly reducing the number of frames processed at each agent.
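The consensus-based framework described above (DMVF) has agents periodically reconcile their fast-forwarding paces with connected neighbors. A minimal sketch of such a consensus step is shown below; the averaging rule, function names, and parameters are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical sketch of a consensus update among agents (DMVF-style):
# each agent holds a fast-forwarding "pace" (roughly, frames to skip per step)
# and repeatedly moves it toward the mean of its connected neighbors' paces
# until the network agrees. This is standard linear consensus averaging,
# used here only to illustrate the idea.

def consensus_paces(paces, neighbors, rounds=50, alpha=0.5):
    """Iteratively blend each agent's pace with the mean of its neighbors.

    paces:     list of initial paces, one per agent
    neighbors: dict mapping agent index -> list of neighbor indices
    alpha:     blending weight toward the neighborhood mean
    """
    paces = list(paces)
    for _ in range(rounds):
        updated = []
        for i, p in enumerate(paces):
            nbr_mean = sum(paces[j] for j in neighbors[i]) / len(neighbors[i])
            updated.append((1 - alpha) * p + alpha * nbr_mean)
        paces = updated
    return paces

# Three agents on a line graph (0 -- 1 -- 2) with different initial paces:
final = consensus_paces([2.0, 8.0, 5.0], {0: [1], 1: [0, 2], 2: [1]})
# After enough rounds, all agents converge to a common pace.
```

In the actual frameworks, the quantity being negotiated would come from learned reinforcement-learning strategies rather than a fixed scalar, but the communicate-and-agree structure is the same.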
Similar resources
Investigation and generalization of multi-relay parity forwarding
Today, with the ever-growing expansion of communications and telecommunication networks into all areas of human life, and the urgent need for high-rate, high-quality communications, fundamental and applied research in telecommunications science and engineering is among the top priorities of developed and developing societies. Given this necessity, this thesis reviews the research conducted in the field of network information theory, particularly relay networks in information theory, which have attracted much attention...
Modeling emotions in learning multi-agent systems
This thesis investigates the positive or negative role of emotions in the performance of learning agents in a multi-agent environment. To this end, a model for learning agents with emotions is introduced. To examine the role of emotions, a hypothetical multi-agent environment is simulated and various scenarios are considered within it. In the first scenario, the performance of agents that have no emotions and only the capability to learn is examined. In the second scenario...
15 صفحه اولFast-Forwarding Crowd Simulations
The processing time needed to simulate crowds for games or simulations is a real challenge. While the increasing power of processing capacity is a reality in the hardware industry, it also means that more agents, better rendering, and more sophisticated Artificial Intelligence (AI) methods can be used, so the computational time remains an issue. Despite the processing cost, in many cases the most inte...
Collaborative Annotation of Multi-View Video
We present a novel system for collaborative annotation of multi-view video using a web-based application framework. The server side synchronizes multiple users annotating video. An XML data scheme allows streaming updates to the server and clients so that all clients are updated in real time. The interface for annotation is also used for visualization of the rich media, with the aim to allow users...
Dynamic configuration and collaborative scheduling in supply chains based on scalable multi-agent architecture
Due to diversified and frequently changing demands from customers, technological advances and global competition, manufacturers rely on collaboration with their business partners to share costs, risks and expertise. How to take advantage of advancement of technologies to effectively support operations and create competitive advantage is critical for manufacturers to survive. To respond to these...
Journal
Journal title: IEEE Transactions on Multimedia
Year: 2023
ISSN: 1520-9210, 1941-0077
DOI: https://doi.org/10.1109/tmm.2023.3275853